17 research outputs found

    Towards Machine Wald

    The past century has seen a steady increase in the need to estimate and predict complex systems and to make (possibly critical) decisions with limited information. Although computers have made the numerical evaluation of sophisticated statistical models possible, these models are still designed by humans because there is currently no known recipe or algorithm for reducing the design of a statistical model to a sequence of arithmetic operations. Indeed, enabling computers to think as humans do when faced with uncertainty is challenging in several major ways: (1) finding optimal statistical models has yet to be formulated as a well-posed problem when information on the system of interest is incomplete and comes as a complex combination of sample data, partial knowledge of constitutive relations, and a limited description of the distribution of input random variables; (2) the space of admissible scenarios, along with the space of relevant information, assumptions, and/or beliefs, tends to be infinite dimensional, whereas calculus on a computer is necessarily discrete and finite. To this end, this paper explores the foundations of a rigorous framework for the scientific computation of optimal statistical estimators/models and reviews their connections with Decision Theory, Machine Learning, Bayesian Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty Quantification and Information Based Complexity.
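    The Wald-style, decision-theoretic viewpoint the abstract refers to can be made concrete with a small numerical experiment. The sketch below is purely illustrative and not the framework developed in the paper: it brute-force searches for a minimax estimator of a Bernoulli parameter over an assumed family of shrinkage estimators; the estimator family, grid sizes, and parameter names are assumptions made for this example.

        import numpy as np
        from scipy.stats import binom

        # Illustrative sketch (not the paper's framework): numerically search for a
        # minimax (worst-case-risk-minimizing) estimator of a Bernoulli parameter
        # theta from n coin flips, within the assumed family (k + a) / (n + a + b).

        n = 20
        thetas = np.linspace(0.0, 1.0, 201)   # grid over admissible scenarios
        ks = np.arange(n + 1)                 # possible observed success counts

        def worst_case_risk(a, b):
            """Worst-case mean squared error of the estimator (k + a)/(n + a + b)."""
            est = (ks + a) / (n + a + b)                          # shape (n+1,)
            pmf = binom.pmf(ks[None, :], n, thetas[:, None])      # shape (201, n+1)
            risk = (pmf * (est[None, :] - thetas[:, None]) ** 2).sum(axis=1)
            return risk.max()                                     # sup over theta

        # Brute-force search over a small grid of (a, b) pseudo-counts.
        grid = np.linspace(0.0, 5.0, 51)
        best = min((worst_case_risk(a, b), a, b) for a in grid for b in grid)
        print("minimax risk %.5f at a=%.2f, b=%.2f" % best)
        # For squared-error loss the classical minimax choice is a = b = sqrt(n)/2,
        # so the search should land near a = b ~ 2.24 for n = 20.

    The example only covers the finite, fully specified case; the abstract's point is precisely that the general problem, with infinite-dimensional information and belief spaces, is much harder to pose and compute.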

    Trends in the Statistical Assessment of Reliability

    Changes in technology have had, and will continue to have, a strong effect on the statistical assessment of reliability data. These changes include higher levels of integration in electronics, improvements in measurement technology and the deployment of sensors and smart chips into more products, dramatically improved computing power and storage technology, and the development of new, powerful statistical methods for graphics, inference, experimental design, and reliability test planning. This paper traces some of the history of the development of statistical methods for reliability assessment and makes some predictions about the future.

    Stochastic interdependence, possibility and probabilistic causality


    Probabilistic fuzzy systems as additive fuzzy systems

    Probabilistic fuzzy systems combine a linguistic description of system behaviour with statistical properties of data. They were originally derived from Zadeh’s concept of the probability of a fuzzy event. Two equivalent additive reasoning schemes were proposed, which lead to an estimate of the output’s conditional probability density. In this work we take a complementary approach and derive a probabilistic fuzzy system from an additive fuzzy system. We show that some fuzzy systems with universal approximation capabilities can compute the same expected output value as probabilistic fuzzy systems, and we discuss some similarities and differences between them. The practical relevance of this functional equivalence result is that learning algorithms, optimization techniques and design considerations can, under certain circumstances, be transferred across the two paradigms.
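    The functional-equivalence claim can be illustrated with a minimal sketch. Under assumed rule parameters (not taken from the paper), the code below compares the expected output of a probabilistic fuzzy system, E[y | x] = sum_j P(rule_j | x) E[y | rule_j], with the output of an additive fuzzy system whose singleton consequents are placed at the rules' conditional means; all membership shapes and numbers are hypothetical.

        import numpy as np

        # Illustrative sketch of the equivalence discussed above; the rule
        # parameters and membership shapes are assumptions, not the paper's.

        def gauss(x, c, s):
            """Gaussian membership function centred at c with width s."""
            return np.exp(-0.5 * ((x - c) / s) ** 2)

        # Three hypothetical rules: antecedent centres/widths and, for the
        # probabilistic fuzzy system, the conditional mean of y given each rule.
        centres = np.array([-2.0, 0.0, 2.0])
        widths = np.array([1.0, 1.0, 1.0])
        rule_means = np.array([-1.0, 0.5, 3.0])   # E[y | rule_j]

        def memberships(x):
            return gauss(x, centres, widths)

        def probabilistic_fuzzy_expectation(x):
            """E[y | x] = sum_j P(rule_j | x) * E[y | rule_j]."""
            mu = memberships(x)
            p = mu / mu.sum()                     # normalised firing strengths
            return np.dot(p, rule_means)

        def additive_fuzzy_output(x):
            """Standard additive fuzzy output (zero-order Takagi-Sugeno style)
            with singleton consequents placed at the rule means."""
            mu = memberships(x)
            return np.dot(mu, rule_means) / mu.sum()

        for x in (-1.5, 0.3, 2.7):
            print(x, probabilistic_fuzzy_expectation(x), additive_fuzzy_output(x))
            # The two outputs coincide: the additive system reproduces the
            # expected output of the probabilistic fuzzy system whenever its
            # singleton consequents equal the rules' conditional means.

    In this toy setting the two computations are algebraically identical, which is the sense in which parameter-learning methods developed for one representation can, under suitable conditions, be reused for the other.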